Crowd-Machine Collaboration for Item Screening

Authors

  • Evgeny Krivosheev
  • Bahareh Harandizadeh
  • Fabio Casati
  • Boualem Benatallah
Abstract

In this paper we describe how crowds and machine classifiers can be efficiently combined to screen items that satisfy a set of predicates. We show that this is a recurring problem in many domains, present machine-human (hybrid) algorithms that screen items efficiently, and estimate the gain over human-only or machine-only screening in terms of performance and cost.
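For illustration only, the sketch below shows one way such a hybrid screen could be wired together: a machine classifier decides an (item, predicate) pair when it is confident, and uncertain pairs are routed to crowd workers whose votes are aggregated by majority; an item is kept only if every predicate holds. The function names, thresholds, and the aggregation rule are assumptions made for this example, not the algorithms evaluated in the paper.

# Minimal sketch (assumed design, not the paper's algorithm): a hybrid screen that
# trusts a machine classifier when it is confident and falls back to crowd votes otherwise.
import random
from collections import Counter

MACHINE_CONFIDENCE = 0.9   # assumed threshold below which an (item, predicate) pair goes to the crowd
VOTES_PER_ITEM = 3         # assumed number of crowd votes collected per (item, predicate) pair

def machine_score(item, predicate):
    # Hypothetical stand-in for a trained classifier's estimate of P(item satisfies predicate).
    return random.random()

def ask_crowd(item, predicate, n_votes):
    # Hypothetical stand-in for collecting binary crowd judgments on (item, predicate).
    return [random.random() < 0.5 for _ in range(n_votes)]

def satisfies(item, predicate):
    # Use the machine decision when it is confident either way; otherwise take a crowd majority vote.
    p = machine_score(item, predicate)
    if p >= MACHINE_CONFIDENCE:
        return True
    if p <= 1 - MACHINE_CONFIDENCE:
        return False
    votes = ask_crowd(item, predicate, VOTES_PER_ITEM)
    return Counter(votes).most_common(1)[0][0]

def screen(items, predicates):
    # An item passes the screen only if every predicate is satisfied (screening is a conjunction).
    return [item for item in items if all(satisfies(item, p) for p in predicates)]

print(screen(["paper-1", "paper-2"], ["about-crowdsourcing", "reports-an-experiment"]))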

Similar resources

Improving Machine Learning Ability with Fine-Tuning

Item Response Theory (IRT) allows for measuring ability of Machine Learning models as compared to a human population. However, it is difficult to create a large dataset to train the ability of deep neural network models (DNNs). We propose Crowd-Informed Fine-Tuning (CIFT) as a new training process, where a pre-trained model is fine-tuned with a specialized supplemental training set obtained via...

Teams vs. Crowds: a Field Test of the Relative Contribution of Incentives, Member Ability, and Emergent Collaboration to Crowd-based Problem Solving Performance

Organizations are increasingly turning to crowdsourcing to solve difficult problems. This is often driven by the desire to find the best subject matter experts, strongly incentivize them, and engage them with as little coordination cost as possible. A growing number of authors, however, are calling for increased collaboration in crowdsourcing settings, hoping to draw upon the advantages of team...

Combining Crowd and Expert Labels Using Decision Theoretic Active Learning

We consider a finite-pool data categorization scenario which requires exhaustively classifying a given set of examples with a limited budget. We adopt a hybrid human-machine approach which blends automatic machine learning with human labeling across a tiered workforce composed of domain experts and crowd workers. To effectively achieve high-accuracy labels over the instances in the pool at mini...

Hybrid Crowd-Machine Methods as Alternatives to Pooling and Expert Judgments

Pooling is a document sampling strategy commonly used to collect relevance judgments when multiple retrieval/ranking algorithms are involved. A fixed number of top ranking documents from each algorithm form a pool. Traditionally, expensive experts judge the pool of documents for relevance. We propose and test two hybrid algorithms as alternatives that reduce assessment costs and are effective. ...

Tracking Human Process Using Crowd Collaboration to Enrich Data

A rich source of data that has been largely ignored in crowdsourcing is the processes that humans use to accomplish a task. If we can capture this information, we could model it for automatic processing and use it to better understand the phenomena being modeled. Using crowd collaboration to trace workers’ process produces a rich dataset that can be mined for new insights. We tested this approa...


Publication year: 2018